An $n\times n$ matrix is said to have a self-interlacing spectrum if its eigenvalues $\lambda_k$, $k=1,\ldots,n$, are distributed as follows: $$\lambda_1>-\lambda_2>\lambda_3>\cdots>(-1)^{n-1}\lambda_n>0.$$ A method for constructing sign-definite matrices with self-interlacing spectra from totally nonnegative ones is presented. We apply this method to bidiagonal and tridiagonal matrices. In particular, we generalize a result by O. Holtz on the spectrum of real symmetric anti-bidiagonal matrices with positive nonzero entries.
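The defining inequality chain can be checked numerically. Below is a minimal sketch (the function name `is_self_interlacing` is our own, not from the paper): it maps each eigenvalue $\lambda_k$ to $(-1)^{k-1}\lambda_k$ and verifies that the resulting sequence is strictly decreasing and positive, which is equivalent to the displayed chain.

```python
def is_self_interlacing(eigs):
    """Check whether eigs = [l1, ..., ln] (ordered as in the definition)
    satisfies l1 > -l2 > l3 > ... > (-1)^(n-1) * ln > 0.

    Hypothetical helper for illustration; not part of the paper.
    """
    # t[k] = (-1)^k * eigs[k], so the chain becomes t[0] > t[1] > ... > 0.
    t = [(-1) ** k * lam for k, lam in enumerate(eigs)]
    return all(a > b for a, b in zip(t, t[1:])) and t[-1] > 0


# Example: [3, -2, 1] maps to [3, 2, 1], strictly decreasing and positive.
print(is_self_interlacing([3, -2, 1]))  # True
print(is_self_interlacing([3, 2, 1]))   # False: 3 > -2 holds, but -2 > 1 fails
```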